6 research outputs found

    Random Modulo: A new processor cache design for real-time critical systems

    Get PDF
    Cache memories have a huge impact on software's worst-case execution time (WCET). While enabling the seamless use of caches is key to providing the increasing levels of (guaranteed) performance required by automotive software, caches complicate timing analysis. In the context of Measurement-Based Probabilistic Timing Analysis (MBPTA) - a promising technique to ease the timing analysis of complex hardware - we propose Random Modulo (RM), a new cache design that provides the probabilistic behavior required by MBPTA and offers the following advantages over existing MBPTA-compliant cache designs: (i) an outstanding reduction in WCET estimates, (ii) lower latency and area overhead, and (iii) competitive average performance w.r.t. conventional caches. Peer Reviewed. Postprint (author's final draft).
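    A minimal, illustrative Python sketch of the general idea behind MBPTA-compliant cache placement follows. It is not the Random Modulo hardware design from the paper: it only shows how making the address-to-set mapping depend on a per-run random seed turns conflict patterns into a probabilistic property. Class, function, and parameter names are hypothetical.

        import random

        class RandomizedSetIndex:
            """Simplified sketch (NOT the paper's Random Modulo circuit):
            the set index depends on a random seed drawn per run, so which
            addresses conflict in the cache varies probabilistically across
            runs while staying fixed within a run."""

            def __init__(self, num_sets, line_size, seed=None):
                assert num_sets & (num_sets - 1) == 0, "num_sets must be a power of two"
                self.num_sets = num_sets
                self.line_size = line_size
                # A fresh seed per run/reset yields a new (but fixed) placement.
                self.seed = seed if seed is not None else random.getrandbits(32)

            def set_index(self, address):
                line_addr = address // self.line_size
                index_bits = line_addr % self.num_sets   # conventional modulo index
                tag_bits = line_addr // self.num_sets    # upper address bits
                # Mix the tag bits with the per-run seed and fold into the index,
                # randomizing conflicts while spreading consecutive lines.
                h = (tag_bits * 0x9E3779B1 ^ self.seed) & 0xFFFFFFFF
                return (index_bits ^ h) % self.num_sets

        # Example: two runs (different seeds) place the same address differently.
        run1 = RandomizedSetIndex(num_sets=64, line_size=32, seed=1)
        run2 = RandomizedSetIndex(num_sets=64, line_size=32, seed=2)
        print(run1.set_index(0x8000), run2.set_index(0x8000))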

    Improving Measurement-Based Timing Analysis through Randomisation and Probabilistic Analysis

    Get PDF
    The use of increasingly complex hardware and software platforms in response to the ever-rising performance demands of modern real-time systems complicates the verification and validation of their timing behaviour, which form a time- and effort-intensive step of system qualification or certification. In this paper we relate the current state of practice in measurement-based timing analysis, the predominant choice for industrial developers, to the progress of the PROXIMA project in that very field. We recall the difficulties that the shift towards more complex computing platforms causes in that regard. Then we discuss the probabilistic approach proposed by PROXIMA to overcome some of those limitations. We present the main principles behind the PROXIMA approach as well as the changes it requires at the hardware or software level underneath the application. We also present the current status of the project against its overall goals, and highlight some of the principal confidence-building results achieved so far.
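    For illustration only, the Python sketch below shows the core statistical step behind MBPTA as commonly described in the literature: fit an extreme value (Gumbel) distribution to block maxima of measured execution times and read off a probabilistic WCET at a target exceedance probability. It is not the PROXIMA toolchain; the function name, block size, and synthetic measurements are assumptions made for the example.

        import numpy as np
        from scipy.stats import gumbel_r

        def pwcet_estimate(exec_times, block_size=50, exceedance_prob=1e-15):
            """Illustrative MBPTA-style estimate (hypothetical helper, not the
            PROXIMA tools): Gumbel fit on block maxima of execution times."""
            times = np.asarray(exec_times, dtype=float)
            n_blocks = len(times) // block_size
            maxima = times[: n_blocks * block_size].reshape(n_blocks, block_size).max(axis=1)
            loc, scale = gumbel_r.fit(maxima)                 # fit Gumbel to block maxima
            # The pWCET is the execution time whose exceedance probability
            # equals the target, i.e. the inverse survival function.
            return gumbel_r.isf(exceedance_prob, loc=loc, scale=scale)

        # Example with synthetic measurements standing in for a time-randomized run.
        rng = np.random.default_rng(0)
        measurements = 1000 + rng.gamma(shape=2.0, scale=15.0, size=5000)
        print(f"pWCET @ 1e-15 exceedance: {pwcet_estimate(measurements):.1f} cycles")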

    Random Modulo: A new processor cache design for real-time critical systems

    No full text
    Cache memories have a huge impact on software's worst-case execution time (WCET). While enabling the seamless use of caches is key to providing the increasing levels of (guaranteed) performance required by automotive software, caches complicate timing analysis. In the context of Measurement-Based Probabilistic Timing Analysis (MBPTA) - a promising technique to ease the timing analysis of complex hardware - we propose Random Modulo (RM), a new cache design that provides the probabilistic behavior required by MBPTA and offers the following advantages over existing MBPTA-compliant cache designs: (i) an outstanding reduction in WCET estimates, (ii) lower latency and area overhead, and (iii) competitive average performance w.r.t. conventional caches. Peer Reviewed.

    Fitting processor architectures for measurement-based probabilistic timing analysis

    Get PDF
    The pressing market demand for competitive performance/cost ratios compels the Critical Real-Time Embedded Systems industry to employ feature-rich hardware. The ensuing rise in hardware complexity, however, makes worst-case execution time (WCET) analysis of software programs - which is often required, especially for programs at the highest levels of integrity - an even harder challenge. State-of-the-art WCET analysis techniques are hampered by the soaring cost and complexity of obtaining accurate knowledge of the internal operation of advanced processors, and by the difficulty of relating data obtained from measurement observations with reliable worst-case behaviour. This frustrating conundrum calls for novel solutions with low intrusiveness on development practice. Measurement-Based Probabilistic Timing Analysis (MBPTA) techniques offer the opportunity to simultaneously reduce the cost of acquiring the knowledge needed for computing reliable WCET bounds and gain increased confidence in the representativeness of measurement observations. This paper describes the changes required in the design of several high-performance features - massively used in modern processors - to meet MBPTA requirements.
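    As a concrete, hedged example of the kind of hardware change this line of work argues for, the Python sketch below models random replacement in a single cache set: the victim way is chosen at random, so the miss count for a given access sequence follows a probability distribution rather than being a single fixed value. This illustrates the principle only; it does not describe any specific feature redesign from the paper.

        import random

        class RandomReplacementSet:
            """Minimal model of one MBPTA-friendly change: a deterministic
            policy (e.g., LRU) is replaced by random replacement, giving the
            victim choice - and hence the miss pattern - a well-defined
            probability distribution. Purely illustrative."""

            def __init__(self, ways):
                self.ways = ways
                self.lines = [None] * ways   # stored tags, None = invalid way

            def access(self, tag):
                """Return True on hit, False on miss (filling or evicting)."""
                if tag in self.lines:
                    return True
                if None in self.lines:
                    self.lines[self.lines.index(None)] = tag       # fill an invalid way
                else:
                    self.lines[random.randrange(self.ways)] = tag  # evict a random victim
                return False

        # Example: the same access sequence can yield different miss counts per run.
        cache_set = RandomReplacementSet(ways=4)
        sequence = [1, 2, 3, 4, 5, 1, 2, 3, 4, 5] * 3
        misses = sum(0 if cache_set.access(t) else 1 for t in sequence)
        print(f"misses in this run: {misses}")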

    Fitting processor architectures for measurement-based probabilistic timing analysis

    No full text
    The pressing market demand for competitive performance/cost ratios compels the Critical Real-Time Embedded Systems industry to employ feature-rich hardware. The ensuing rise in hardware complexity, however, makes worst-case execution time (WCET) analysis of software programs - which is often required, especially for programs at the highest levels of integrity - an even harder challenge. State-of-the-art WCET analysis techniques are hampered by the soaring cost and complexity of obtaining accurate knowledge of the internal operation of advanced processors, and by the difficulty of relating data obtained from measurement observations with reliable worst-case behaviour. This frustrating conundrum calls for novel solutions with low intrusiveness on development practice. Measurement-Based Probabilistic Timing Analysis (MBPTA) techniques offer the opportunity to simultaneously reduce the cost of acquiring the knowledge needed for computing reliable WCET bounds and gain increased confidence in the representativeness of measurement observations. This paper describes the changes required in the design of several high-performance features - massively used in modern processors - to meet MBPTA requirements. This work has received funding from the European Community's Seventh Framework Programme [FP7/2007-2013] under grant agreement 611085 (PROXIMA, www.proxima-project.eu). Support was also provided by the Ministry of Science and Technology of Spain under contract TIN2015-65316-P and the HiPEAC Network of Excellence. Leonidas Kosmidis is funded by the Spanish Ministry of Education under FPU grant AP2010-4208. Jaume Abella has been partially supported by the MINECO under Ramon y Cajal postdoctoral fellowship number RYC-2013-14717. The authors wish to acknowledge Michael Houston, Liliana Cucu-Grosjean and Luca Santinelli for contributing to the genesis of this work. Peer Reviewed.

    PROXIMA: Improving Measurement-Based Timing Analysis through Randomisation and Probabilistic Analysis

    No full text
    The use of increasingly complex hardware and software platforms in response to the ever-rising performance demands of modern real-time systems complicates the verification and validation of their timing behaviour, which form a time- and effort-intensive step of system qualification or certification. In this paper we relate the current state of practice in measurement-based timing analysis, the predominant choice for industrial developers, to the progress of the PROXIMA (Probabilistic real-time control of mixed-criticality multicore systems) project in that very field. We recall the difficulties that the shift towards more complex computing platforms causes in that regard. Then we discuss the probabilistic approach proposed by PROXIMA to overcome some of those limitations. We present the main principles behind the PROXIMA approach as well as the changes it requires at the hardware or software level underneath the application. We also present the current status of the project against its overall goals, and highlight some of the principal confidence-building results achieved so far. The research leading to these results has received funding from the European Community's Seventh Framework Programme [FP7/2007-2013] under the PROXIMA Project (grant agreement 611085). Carles Hernández is jointly funded by the Spanish Ministry of Economy and Competitiveness (MINECO) and FEDER funds through grant TIN2014-60404-JIN. Jaume Abella has been partially supported by the MINECO under Ramon y Cajal postdoctoral fellowship number RYC-2013-14717. Peer Reviewed.